Sparse Legendre expansions via l1-minimization
Authors
Abstract
We consider the problem of recovering polynomials that are sparse with respect to the basis of Legendre polynomials from a small number of random samples. In particular, we show that a Legendre s-sparse polynomial of maximal degree N can be recovered from m ≍ s log(N) random samples that are chosen independently according to the Chebyshev probability measure dν(x) = π^{-1}(1 − x^2)^{-1/2} dx. As an efficient recovery method, ℓ1-minimization can be used. We establish these results by verifying the restricted isometry property of a preconditioned random Legendre matrix. We then extend these results to a large class of orthogonal polynomial systems, including the Jacobi polynomials, of which the Legendre polynomials are a special case. Finally, we transpose these results into the setting of approximate recovery for functions in certain infinite-dimensional function spaces.
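To make the scheme described in the abstract concrete, the following is a minimal numerical sketch (not the authors' code; the problem sizes, random seed, and the exact preconditioning constant are illustrative assumptions): sample points are drawn from the Chebyshev measure, the rows of the Legendre matrix are weighted by (1 − x^2)^{1/4}, and the coefficients are recovered by ℓ1-minimization using cvxpy.

```python
# Minimal sketch: recover a Legendre-sparse polynomial from Chebyshev-distributed
# samples via preconditioned l1-minimization (illustrative, not the paper's code).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
N, s, m = 64, 5, 40                      # max degree, sparsity, number of samples

# Ground-truth s-sparse coefficient vector in the (orthonormal) Legendre basis
c_true = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
c_true[support] = rng.standard_normal(s)

# Draw samples i.i.d. from the Chebyshev measure dν(x) = π^{-1}(1 - x^2)^{-1/2} dx
x = np.cos(np.pi * rng.random(m))

# Legendre matrix: column n holds sqrt(2n+1) * P_n(x_j), orthonormal w.r.t. dx/2
A = np.polynomial.legendre.legvander(x, N - 1) * np.sqrt(2 * np.arange(N) + 1)
y = A @ c_true                           # noiseless samples of the polynomial

# Preconditioning: weight each row by (1 - x_j^2)^{1/4} (constant factor assumed),
# which turns the Legendre rows into uniformly bounded, Chebyshev-like rows
d = np.sqrt(np.pi / 2) * (1 - x**2) ** 0.25
DA = A * d[:, None]

# Basis pursuit: min ||c||_1  subject to  D A c = D y
c = cp.Variable(N)
cp.Problem(cp.Minimize(cp.norm1(c)), [DA @ c == d * y]).solve()
print("recovery error:", np.linalg.norm(c.value - c_true))
```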
Similar resources
Exact Recovery for Sparse Signal via Weighted l_1 Minimization
Numerical experiments in the compressed sensing literature have indicated that reweighted l1 minimization performs exceptionally well in recovering sparse signals. In this paper, we develop exact recovery conditions and an algorithm for sparse signal recovery via weighted l1 minimization, building on the classical NSP (null space property) and RIC (restricted isometry constant) bounds. We first int...
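For reference, the weighted ℓ1 program analyzed in this line of work is min ‖Wz‖_1 subject to Az = y, where the diagonal weights encode prior information about the support. A minimal sketch (illustrative only; the weighting rule in the usage comment is a hypothetical choice, not this paper's algorithm):

```python
# Minimal sketch of weighted l1-minimization: entries believed to lie in the
# support receive smaller weights, so they are penalized less.
import numpy as np
import cvxpy as cp

def weighted_l1_recover(A, y, weights):
    """Solve  min ||diag(weights) z||_1  subject to  A z = y."""
    z = cp.Variable(A.shape[1])
    cp.Problem(cp.Minimize(cp.norm1(cp.multiply(weights, z))), [A @ z == y]).solve()
    return z.value

# Hypothetical usage: down-weight an assumed prior support estimate T.
# weights = np.where(np.isin(np.arange(n), T), 0.1, 1.0)
```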
Recovery of signals by a weighted $\ell_2/\ell_1$ minimization under arbitrary prior support information
In this paper, we introduce a weighted l2/l1 minimization to recover block-sparse signals with arbitrary prior support information. When partial prior support information is available, a sufficient condition based on the high-order block RIP is derived to guarantee stable and robust recovery of block-sparse signals via the weighted l2/l1 minimization. We then show that if the accuracy of arbitrary p...
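The weighted ℓ2/ℓ1 program referred to here minimizes a weighted sum of blockwise ℓ2 norms under the measurement constraint. A minimal sketch, assuming a user-supplied block partition and weights (illustrative, not the paper's code):

```python
# Minimal sketch of weighted l2/l1 minimization for block-sparse recovery.
import numpy as np
import cvxpy as cp

def weighted_l2_l1_recover(A, y, blocks, weights):
    """Solve  min  sum_i w_i * ||z[blocks[i]]||_2   subject to  A z = y."""
    z = cp.Variable(A.shape[1])
    objective = sum(w * cp.norm(z[b], 2) for w, b in zip(weights, blocks))
    cp.Problem(cp.Minimize(objective), [A @ z == y]).solve()
    return z.value

# Hypothetical usage: n = 20 split into 5 blocks of length 4, with smaller
# weights on blocks believed (from prior information) to be active.
# blocks = [np.arange(4 * i, 4 * (i + 1)) for i in range(5)]
```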
Identification of Sparse Operators
On recovery of sparse signals via l1 minimization
This article considers constrained l1 minimization methods for the recovery of high-dimensional sparse signals in three settings: noiseless, bounded error, and Gaussian noise. A unified and elementary treatment is given in these noise settings of two l1 minimization methods: the Dantzig selector and l1 minimization with an l2 constraint. The results of this paper improve the existing results in...
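For concreteness, the two programs compared here are ℓ1-minimization with an ℓ2 constraint, min ‖z‖_1 subject to ‖Az − y‖_2 ≤ ε, and the Dantzig selector, min ‖z‖_1 subject to ‖A^T(Az − y)‖_∞ ≤ λ. A minimal cvxpy sketch of both (illustrative formulations; ε and λ are user-chosen noise parameters):

```python
# Minimal sketches of the two constrained l1 programs (illustrative only).
import cvxpy as cp

def l1_with_l2_constraint(A, y, eps):
    """min ||z||_1  subject to  ||A z - y||_2 <= eps."""
    z = cp.Variable(A.shape[1])
    cp.Problem(cp.Minimize(cp.norm1(z)), [cp.norm(A @ z - y, 2) <= eps]).solve()
    return z.value

def dantzig_selector(A, y, lam):
    """min ||z||_1  subject to  ||A^T (A z - y)||_inf <= lam."""
    z = cp.Variable(A.shape[1])
    cp.Problem(cp.Minimize(cp.norm1(z)),
               [cp.norm(A.T @ (A @ z - y), "inf") <= lam]).solve()
    return z.value
```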
Numerical Studies of the Generalized l1 Greedy Algorithm for Sparse Signals
The generalized l1 greedy algorithm was recently introduced and used to reconstruct medical images in computerized tomography within the compressed sensing framework via total variation minimization. Experimental results showed that this algorithm is superior to the reweighted l1-minimization and l1 greedy algorithms in reconstructing these medical images. In this paper, the effectiveness of the gen...
Journal: Journal of Approximation Theory
Volume: 164, Issue: -
Pages: -
Publication year: 2012